
    Insight into High-quality Aerodynamic Design Spaces through Multi-objective Optimization

    An approach to support the computational aerodynamic design process is presented and demonstrated through the application of a novel multi-objective variant of the Tabu Search optimization algorithm for continuous problems to the aerodynamic design optimization of turbomachinery blades. The aim is to improve the performance of a specific stage and ultimately of the whole engine. The integrated system developed for this purpose is described; it combines the optimizer with an existing geometry parameterization scheme and a well-established CFD package. The system's performance is illustrated through two case studies, one two-dimensional and one three-dimensional, in which flow characteristics important to the overall performance of turbomachinery blades are optimized. By showing the designer the trade-off surfaces between the competing objectives, this approach provides considerable insight into the design space under consideration and presents the designer with a range of different Pareto-optimal designs for further consideration. Special emphasis is given to the dimensionality in objective function space of the optimization problem, which seeks designs that perform well for a range of flow performance metrics. The resulting compressor blades achieve their high performance by exploiting complicated physical mechanisms successfully identified through the design process. The system can readily be run on parallel computers, substantially reducing wall-clock run times, a significant benefit when tackling computationally demanding design problems. Overall optimal performance is offered by compromise designs on the Pareto trade-off surface revealed through a true multi-objective design optimization test case. Bearing in mind the continuing rapid advances in computing power and the benefits discussed, this approach brings the adoption of such techniques in real-world engineering design practice a step closer.
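
    The abstract gives no implementation details, so the following is only a minimal sketch of the Pareto-dominance archive that any multi-objective optimizer (including the Tabu Search variant described) must maintain. The two objective functions and the random perturbation moves are hypothetical stand-ins for the CFD-evaluated blade metrics and the real Tabu moves.

```python
# Minimal sketch of Pareto-archive bookkeeping for a multi-objective optimizer.
# Objectives and moves below are hypothetical; the real system evaluates blade
# geometries with a CFD package and uses structured Tabu Search moves.
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Insert candidate into the non-dominated archive, pruning dominated points."""
    if any(dominates(kept, candidate) for kept in archive):
        return archive  # candidate is dominated: discard it
    return [kept for kept in archive if not dominates(candidate, kept)] + [candidate]

def objectives(x):
    # Two hypothetical competing objectives over a 1-D "design variable" x.
    return (x ** 2, (x - 2.0) ** 2)

archive = []
x = random.uniform(-1.0, 3.0)
for _ in range(2000):
    # A real Tabu Search would generate structured moves and keep a tabu list;
    # random perturbation is used here purely to exercise the archive logic.
    x = min(3.0, max(-1.0, x + random.uniform(-0.2, 0.2)))
    archive = update_archive(archive, objectives(x))

print(f"{len(archive)} non-dominated points approximating the trade-off surface")
```

    The archive is, in effect, the trade-off surface shown to the designer: every retained point is non-dominated with respect to every other.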

    Is telomere length socially patterned? Evidence from the West of Scotland Twenty-07 study

    Lower socioeconomic status (SES) is strongly associated with an increased risk of morbidity and premature mortality, but it is not known if the same is true for telomere length, a marker often used to assess biological ageing. The West of Scotland Twenty-07 Study was used to investigate this; it consists of three cohorts aged approximately 35 (N = 775), 55 (N = 866) and 75 years (N = 544) at the time of telomere length measurement. Four sets of SES measures were investigated: those collected contemporaneously with telomere length assessment, educational markers, SES in childhood and SES over the preceding twenty years. We found mixed evidence for an association between SES and telomere length. In 35-year-olds, many of the education and childhood SES measures were associated with telomere length (those in poorer circumstances had shorter telomeres), as was intergenerational social mobility, but not accumulated disadvantage. A crude estimate showed that, at the same chronological age, social renters, for example, were nine years (biologically) older than home owners. No consistent associations were apparent in those aged 55 or 75. There is evidence of an association between SES and telomere length, but only in younger adults and most strongly using education and childhood SES measures. These results may reflect that childhood is a sensitive period for telomere attrition. The cohort differences may result from survival bias suppressing the SES-telomere association, cohort effects reflecting different experiences of SES, or telomere length being a less effective marker of biological ageing at older ages.

    Effect of a Perioperative, Cardiac Output-Guided Hemodynamic Therapy Algorithm on Outcomes Following Major Gastrointestinal Surgery: A Randomized Clinical Trial and Systematic Review

    Importance: Small trials suggest that postoperative outcomes may be improved by the use of cardiac output monitoring to guide administration of intravenous fluid and inotropic drugs as part of a hemodynamic therapy algorithm. Objective: To evaluate the clinical effectiveness of a perioperative, cardiac output-guided hemodynamic therapy algorithm. Design, setting, and participants: OPTIMISE was a pragmatic, multicenter, randomized, observer-blinded trial of 734 high-risk patients aged 50 years or older undergoing major gastrointestinal surgery at 17 acute care hospitals in the United Kingdom. An updated systematic review and meta-analysis were also conducted, including randomized trials published from 1966 to February 2014. Interventions: Patients were randomly assigned to a cardiac output-guided hemodynamic therapy algorithm for intravenous fluid and inotrope (dopexamine) infusion during and 6 hours following surgery (n=368) or to usual care (n=366). Main outcomes and measures: The primary outcome was a composite of predefined 30-day moderate or major complications and mortality. Secondary outcomes were morbidity on day 7; infection, critical care-free days, and all-cause mortality at 30 days; all-cause mortality at 180 days; and length of hospital stay. Results: Baseline patient characteristics, clinical care, and volumes of intravenous fluid were similar between groups. Care was nonadherent to the allocated treatment for less than 10% of patients in each group. The primary outcome occurred in 36.6% of intervention and 43.4% of usual care participants (relative risk [RR], 0.84 [95% CI, 0.71-1.01]; absolute risk reduction, 6.8% [95% CI, -0.3% to 13.9%]; P = .07). There was no significant difference between groups for any secondary outcome. Five intervention patients (1.4%) experienced cardiovascular serious adverse events within 24 hours compared with none in the usual care group. Findings of the meta-analysis of 38 trials, including data from this study, suggest that the intervention is associated with fewer complications (intervention, 488/1548 [31.5%] vs control, 614/1476 [41.6%]; RR, 0.77 [95% CI, 0.71-0.83]) and a nonsignificant reduction in hospital, 28-day, or 30-day mortality (intervention, 159/3215 deaths [4.9%] vs control, 206/3160 deaths [6.5%]; RR, 0.82 [95% CI, 0.67-1.01]) and mortality at longest follow-up (intervention, 267/3215 deaths [8.3%] vs control, 327/3160 deaths [10.3%]; RR, 0.86 [95% CI, 0.74-1.00]). Conclusions and relevance: In a randomized trial of high-risk patients undergoing major gastrointestinal surgery, use of a cardiac output-guided hemodynamic therapy algorithm compared with usual care did not reduce a composite outcome of complications and 30-day mortality. However, inclusion of these data in an updated meta-analysis indicates that the intervention was associated with a reduction in complication rates.
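
    As a quick arithmetic check using only the event rates reported above, the headline estimates for the primary outcome follow directly:

    \[
    \mathrm{RR} = \frac{36.6\%}{43.4\%} \approx 0.84,
    \qquad
    \mathrm{ARR} = 43.4\% - 36.6\% = 6.8\%
    \]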

    The First Post-Kepler Brightness Dips of KIC 8462852

    We present a photometric detection of the first brightness dips of the unique variable star KIC 8462852 since the end of the Kepler space mission in 2013 May. Our regular photometric surveillance started in October 2015, and a sequence of dipping began in 2017 May, continuing through the end of 2017, when the star was no longer visible from Earth. We distinguish four main 1-2.5% dips, named "Elsie", "Celeste", "Skara Brae", and "Angkor", which persist on timescales from several days to weeks. Our main results so far are: (i) there are no apparent changes of the stellar spectrum or polarization during the dips; (ii) the multiband photometry of the dips shows differential reddening favoring non-grey extinction. Therefore, our data are inconsistent with dip models that invoke optically thick material, but rather are in line with predictions for an occulter consisting primarily of ordinary dust, where much of the material must be optically thin with a size scale ≪1 μm, and may also be consistent with models invoking variations intrinsic to the stellar photosphere. Notably, our data do not place constraints on the color of the longer-term "secular" dimming, which may be caused by independent processes, or probe different regimes of a single process.
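
    The grey-versus-non-grey distinction reduces to whether the fractional dip depth varies with wavelength. A minimal illustration follows, with hypothetical band depths standing in for the paper's own multiband photometry.

```python
# Illustrative check of "grey" vs "non-grey" (dusty) occultation, in the spirit
# of the multiband test described above. Band names and dip depths are
# hypothetical, not taken from the paper.
import numpy as np

# Fractional dip depths (flux deficit) measured in a blue and a red band.
depth_blue = 0.018   # hypothetical 1.8% dip in the blue band
depth_red = 0.011    # hypothetical 1.1% dip in the red band

ratio = depth_blue / depth_red
if np.isclose(ratio, 1.0, atol=0.1):
    print(f"Depth ratio {ratio:.2f}: consistent with grey (optically thick) material")
else:
    print(f"Depth ratio {ratio:.2f}: wavelength-dependent, favouring optically thin small dust")
```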

    Finishing the euchromatic sequence of the human genome

    The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome, including studies of gene number, birth and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.

    QUANTUM ANNEALING OPTIMIZATION OF A HEURISTIC SURROGATE MODEL FOR PWR FUEL LOADING

    An efficient fuel arrangement must be generated by PWR operators every 6–18 months. This complex problem has been extensively researched, with two broad approaches, heuristic and stochastic methods, becoming accepted. This initial study qualitatively introduces the concept of encoding full-core PWR fuel loading patterns in a form suitable for quantum annealing. The concepts of adiabatic quantum computers and quantum annealing are introduced, and a surrogate model encoding a set of heuristics for loading pattern design is produced in a form suitable for use in present-day quantum annealers. The simulated results show significant similarity to benchmark loading patterns.
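
    The abstract does not spell out the encoding, so the sketch below is only an illustrative guess at the general shape of such a formulation: binary variables x(assembly, position), quadratic one-hot penalties, and a hypothetical heuristic score in place of the paper's surrogate model. Exhaustive search stands in for a quantum annealer on this toy instance.

```python
# Toy QUBO-style encoding of a loading-pattern heuristic, in the spirit of the
# study above. Assembly types, positions, scores and penalty weight are all
# hypothetical; the paper's own encoding is not reproduced here.
import itertools

positions = ["centre", "mid", "edge"]                     # toy core positions
assemblies = ["fresh", "once_burned", "twice_burned"]      # toy assembly types

# Hypothetical heuristic preference scores (lower is better), e.g. encouraging
# burned fuel at the core centre and fresh fuel towards the periphery.
score = {
    ("fresh", "centre"): 3.0, ("fresh", "mid"): 1.0, ("fresh", "edge"): 0.0,
    ("once_burned", "centre"): 1.0, ("once_burned", "mid"): 0.0, ("once_burned", "edge"): 1.0,
    ("twice_burned", "centre"): 0.0, ("twice_burned", "mid"): 1.0, ("twice_burned", "edge"): 3.0,
}

P = 10.0  # penalty weight enforcing the one-hot assignment constraints
variables = [(a, p) for a in assemblies for p in positions]

def energy(bits):
    """Quadratic (QUBO-compatible) energy: heuristic score plus one-hot penalties."""
    x = dict(zip(variables, bits))
    e = sum(score[v] * x[v] for v in variables)
    for p in positions:   # each position holds exactly one assembly
        e += P * (sum(x[(a, p)] for a in assemblies) - 1) ** 2
    for a in assemblies:  # each assembly type is used exactly once
        e += P * (sum(x[(a, p)] for p in positions) - 1) ** 2
    return e

# Exhaustive search replaces the annealer on this 9-variable toy problem.
best = min(itertools.product([0, 1], repeat=len(variables)), key=energy)
print("lowest-energy pattern:",
      [(a, p) for (a, p), bit in zip(variables, best) if bit])
```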

    SURROGATE MODEL OPTIMIZATION OF A ‘MICRO CORE’ PWR FUEL ASSEMBLY ARRANGEMENT USING DEEP LEARNING MODELS

    This paper investigates the applicability of surrogate model optimization (SMO) using deep learning regression models to automatically embed knowledge about the objective function into the optimization process. It demonstrates two deep learning SMO methods for calculating simple neutronics parameters. Using these models, SMO returns results comparable with those from the early stages of direct iterative optimization. However, for this study, the cost of creating the training set outweighs the benefits of the surrogate models.
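
    The specific network architectures and neutronics parameters are not given in the abstract, so the following is a minimal sketch of the general SMO loop it describes: train a regression model on a set of evaluated arrangements, then use the cheap surrogate to rank unseen candidates. The 16-bit arrangement encoding, the synthetic objective, and the MLP size are hypothetical.

```python
# Minimal sketch of surrogate-model optimization with a neural-network
# regressor. The synthetic objective stands in for an expensive neutronics
# calculation on a micro-core assembly arrangement.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def true_objective(arrangement):
    # Hypothetical stand-in for an expensive neutronics parameter evaluation.
    weights = np.linspace(1.0, 2.0, arrangement.shape[-1])
    return arrangement @ weights + 0.5 * np.sin(arrangement.sum(axis=-1))

# 1. Build a (costly) training set of evaluated arrangements.
X_train = rng.integers(0, 2, size=(200, 16)).astype(float)
y_train = true_objective(X_train)

# 2. Fit the surrogate regressor to the evaluated designs.
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
surrogate.fit(X_train, y_train)

# 3. Use the cheap surrogate to rank a large pool of unseen candidates.
candidates = rng.integers(0, 2, size=(5000, 16)).astype(float)
best = candidates[np.argmin(surrogate.predict(candidates))]
print("surrogate pick:", best.astype(int), "true value:", float(true_objective(best)))
```

    In the paper's setting the training targets come from direct neutronics calculations, which is why the cost of building the training set can dominate, as the authors note.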